Minimizers of Cost-Functions Involving Nonsmooth Data-Fidelity Terms. Application to the Processing of Outliers
Author
Abstract
We present a theoretical study of the recovery of an unknown vector x ∈ ℝ^p (a signal, an image) from noisy data y ∈ ℝ^q by minimizing with respect to x a regularized cost-function F(x, y) = Ψ(x, y) + αΦ(x), where Ψ is a data-fidelity term, Φ is a smooth regularization term, and α > 0 is a parameter. Typically, Ψ(x, y) = ‖Ax − y‖², where A is a linear operator. The data-fidelity terms Ψ involved in regularized cost-functions are generally smooth functions; only a few papers make an exception, and they consider restricted situations. Non-smooth data-fidelity terms are avoided in image processing. In spite of this, we consider both smooth and non-smooth data-fidelity terms. Our ambition is to capture the essential features exhibited by the local minimizers of regularized cost-functions in relation to the smoothness of the data-fidelity term. In order to fix the context of our study, we consider Ψ(x, y) = ∑_i ψ(a_i^T x − y_i), where a_i^T are the rows of A and ψ is C^m on ℝ∖{0}. We show that if ψ′(0⁻) < ψ′(0⁺), typical data y give rise to local minimizers x̂ of F(·, y) which fit exactly a certain number of the data entries: there is a possibly large set ĥ of indexes such that a_i^T x̂ = y_i for every i ∈ ĥ. In contrast, if ψ is smooth on ℝ, then for almost every y, the local minimizers of F(·, y) do not fit any entry of y. Thus, the possibility that a local minimizer fits some data entries is due to the non-smoothness of the data-fidelity term. This is a strong mathematical property which is useful in practice. By way of application, we construct a cost-function allowing aberrant data (outliers) to be detected and selectively smoothed. Our numerical experiments advocate the use of non-smooth data-fidelity terms in regularized cost-functions for special purposes in image and signal processing.
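The exact-fit property can be seen on a toy 1-D denoising example. The sketch below is not the paper's algorithm; it is a minimal coordinate-descent illustration (signal, α, and the helper name are our own choices) for the cost F(x, y) = ∑_i |x_i − y_i| + α ∑_i (x_{i+1} − x_i)², i.e. A = I and ψ(t) = |t|, which satisfies ψ′(0⁻) = −1 < 1 = ψ′(0⁺). For brevity, the two endpoints are held at their data values during the sweeps.

```python
import numpy as np

def restore_l1_quadratic(y, alpha=1.0, n_sweeps=200):
    """Coordinate descent for F(x) = sum_i |x_i - y_i| + alpha * sum_i (x_{i+1} - x_i)^2.

    The 1-D subproblem min_s |s - y_i| + alpha*((s - a)^2 + (b - s)^2), with
    a = x[i-1] and b = x[i+1], has the closed-form minimizer
        s = m + clip(y_i - m, -t, t),  m = (a + b) / 2,  t = 1 / (4 * alpha),
    so the entry snaps exactly onto y_i whenever |y_i - m| <= t.
    """
    x = y.astype(float).copy()
    t = 1.0 / (4.0 * alpha)
    for _ in range(n_sweeps):
        for i in range(1, len(x) - 1):   # endpoints stay at y[0], y[-1]
            m = 0.5 * (x[i - 1] + x[i + 1])
            x[i] = m + np.clip(y[i] - m, -t, t)
    return x

# Flat signal corrupted by a single outlier at index 5.
y = np.array([0., 0., 0., 0., 0., 10., 0., 0., 0., 0.])
x_hat = restore_l1_quadratic(y, alpha=1.0)
```

At the minimizer, every entry except the outlier is fit exactly (x̂_i = y_i), while the outlier itself is pulled down to m + t = 0.25 — the selective behavior the abstract describes: the ℓ1 fidelity leaves faithful data untouched and smooths only the aberrant entry.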
Similar resources
Minimization of cost-functions with a non-smooth data-fidelity term. A new approach to the processing of impulsive noise
We consider signal and image restoration using convex cost-functions composed of a non-smooth data-fidelity term and a smooth regularization term. First, we provide a convergent method to minimize such cost-functions. Then we propose an efficient method to remove impulsive noise by minimizing cost-functions composed of an ℓ1 data-fidelity term and an edge-preserving regularization term. Their m...
Stability of Minimizers of Regularized Least Squares Objective Functions I: Study of the Local Behavior
Abstract. Many estimation problems amount to minimizing an objective function composed of a quadratic data-fidelity term and a general regularization term. It is widely accepted that the minimizers obtained using nonsmooth and/or nonconvex regularization terms are frequently good estimates. However, very few facts are known on the ways to control properties of these minimizers. This work is ded...
On Sequential Optimality Conditions without Constraint Qualifications for Nonlinear Programming with Nonsmooth Convex Objective Functions
Sequential optimality conditions provide adequate theoretical tools to justify stopping criteria for nonlinear programming solvers. Here, nonsmooth approximate gradient projection and complementary approximate Karush-Kuhn-Tucker conditions are presented. These sequential optimality conditions are satisfied by local minimizers of optimization problems independently of the fulfillment of constrai...
Bounds on the Minimizers of (nonconvex) Regularized Least-Squares
This is a theoretical study on the minimizers of cost-functions composed of an ℓ2 data-fidelity term and a possibly nonsmooth or nonconvex regularization term acting on the differences or the discrete gradients of the image or the signal to restore. More precisely, we derive general nonasymptotic analytical bounds characterizing the local and the global minimizers of these cost-functions. We fi...
Stability of Minimizers of Regularized Least Squares Objective Functions II: Study of the Global Behavior
We address estimation problems where the sought-after solution is defined as the minimizer of an objective function composed of a quadratic data-fidelity term and a regularization term. We especially focus on nonsmooth and/or nonconvex regularization terms because of their ability to yield good estimates. This work is dedicated to the stability of the minimizers of such nonsmooth and/or nonconv...
Journal:
- SIAM J. Numerical Analysis
Volume 40, Issue –
Pages –
Publication date: 2002